Apple Removes Viral “Tea” and “TeaOnHer” Dating Apps Following Privacy and Moderation Failures

Posted on October 23, 2025 at 09:43 PM


When an app marketed as a “dating safety tool” sparks controversy for allegedly enabling doxxing and exposing minors, the tech world takes notice. Such was the case with Tea and TeaOnHer, two apps Apple has confirmed it removed from its App Store amid escalating safety and privacy concerns. ([TechCrunch][1])


What happened

Apple confirmed that both apps, Tea (also known as “Tea Dating Advice”) and TeaOnHer, have been removed from its App Store globally. Data from app-intelligence firm Appfigures showed the apps were no longer listed on Apple’s store but remained live on Google Play. ([TechCrunch][1])

The removal was triggered by several factors:

  • A high volume of user complaints and negative reviews directed at both apps. ([TechCrunch][1])
  • Allegations of minors’ personal information being posted on the platforms. ([TechCrunch][1])
  • A data breach affecting Tea, in which hackers gained access to 72,000 images: 13,000 selfies and photo IDs submitted for account verification, plus 59,000 images from posts, comments, and direct messages. ([TechCrunch][1])
  • Violations of Apple’s App Review Guidelines, specifically:

    • Rule 1.2 (apps with user-generated content must provide reporting/blocking tools). ([TechCrunch][1])
    • Rule 5.1.2 (apps mustn’t misuse or share personal information without consent). ([TechCrunch][1])
    • Rule 5.6 (excessive negative user feedback can indicate breach of developer code of conduct). ([TechCrunch][1])
  • The developers reportedly failed to adequately address Apple’s concerns after being notified. ([TechCrunch][1])

Notably, the apps had seen significant adoption: Tea had around 6.1 million lifetime downloads and roughly $5 million in gross revenue, while TeaOnHer had about 2.2 million downloads and no in-app purchases. ([TechCrunch][1])


Why this matters

As digital dating platforms proliferate, the case underscores how quickly “safety-first” branding can backfire when the underlying moderation and privacy structures fail. Tea claimed to help women evaluate potential matches by letting them review men (“green flag” or “red flag”), while its counterpart, TeaOnHer, allowed the reverse. But the public debate shifted to privacy, defamation and the risks of user-generated content turning harmful. ([TechCrunch][1])

For Apple, the action signals that high-volume complaints and repeated guideline violations will trigger removal—even for apps already popular and monetising. It reinforces the role of platform governance and the limits on letting apps self-police when harms escalate.

For other platforms such as Google Play, where the apps remained live as of the article’s publication, the episode raises questions about cross-platform consistency and the potential for “app-hop” migration when one store clamps down.


Broader implications and takeaways

  • Platform accountability: App stores bear growing responsibility to monitor apps not just at launch but throughout their lifecycle, especially for user-generated content and safety risks.
  • Safety branding ≠ safety operations: Marketing an app as “for safety” isn’t enough; the operational fundamentals (moderation, privacy safeguards, breach response) must be robust.
  • User privacy and defamation concerns: Allowing users to post personal data about others or evaluate them in public forums opens risks of defamation, harassment, or exposure of minors’ personal information.
  • Cross-platform inconsistency: Removal from one store doesn’t guarantee removal everywhere, which can leave a patchwork of enforcement.
  • Revenue ≠ immunity: Despite millions of downloads and significant revenue, the apps were still removed once violations were confirmed. Monetisation alone won’t shield non-compliance.

Glossary

  • User-generated content (UGC): Content created and posted by users of a platform (e.g., profiles, comments, images) rather than by the app developer. Apps hosting UGC typically face higher moderation obligations.
  • Doxxing: Publishing private or identifying information about a person on the internet, often without their consent, which can lead to harassment or other harm.
  • App Review Guidelines (Apple): The rules published by Apple that developers must follow for their app to be approved for listing on the App Store, covering privacy, safety, user-generated content, and other conduct standards.
  • Defamation: Communicating false statements about a person that harm their reputation. In the context of apps, user-posted evaluations could raise legal risk if inaccurate or malicious.
  • App store intelligence: Research and analytics services that monitor app store listings, removals, downloads, revenue, and similar metrics, often used by media or competitors to track trends.


In closing

The removal of Tea and TeaOnHer by Apple serves as a cautionary tale—for app developers, for platform gatekeepers and for users alike. It shows how quickly a “viral” app can become a liability when moderation, privacy and safety don’t keep pace with hype.

Source: https://techcrunch.com/2025/10/22/apple-confirms-it-pulled-controversial-dating-apps-tea-and-teaonher-from-the-app-store/

[1]: https://techcrunch.com/2025/10/22/apple-confirms-it-pulled-controversial-dating-apps-tea-and-teaonher-from-the-app-store/ “Apple confirms it pulled controversial dating apps Tea and TeaOnHer from the App Store | TechCrunch”